
    LHCb base-line level-0 trigger 3D-flow implementation

    The LHCb Level-0 trigger implementation with the 3D-Flow system offers full programmability, allowing it to adapt to unexpected operating conditions and to enable new, unpredicted physics. The implementation is described in detail and refers to components and technology available today. The 3D-Flow processor system is a new, technology-independent concept in very fast, real-time system architectures. Based on the replication of a single type of circuit of 100k gates, which communicates in six directions (bi-directionally with its North, East, West, and South neighbors, and unidirectionally from Top to Bottom), the system offers full programmability, modularity, ease of expansion, and adaptation to the latest technology. A complete study of its applicability to the LHCb calorimeter triggers is presented, including a full description of the input data handling (in digital or mixed digital-analog form), the data processing, and the transmission of results to the global Level-0 trigger decision unit. Any Level-0 trigger algorithm (2*2, 3*3, 4*4, etc.) with up to 20 steps can be implemented with zero dead-time while sustaining the input data rate (up to 32 bits per input channel, per bunch crossing) at 40 MHz. At each step, each 3D-Flow processor can execute up to 26 operations, including comparing, ranging, finding local maxima, and efficiently exchanging data with neighboring channels. (There is a one-to-one correspondence between input channels and trigger towers.) Populated with only two main types of components (front-end FPGAs and 3D-Flow processors) on a single type of board, the whole Level-0 calorimeter trigger can be accommodated in six 9U crates, each containing 16 identical boards.
All 3D-Flow inter-chip Bottom-to-Top port connections are contained on the board (data are multiplexed 2:1, and PCB traces are shorter than 6 cm); all 3D-Flow inter-chip North, East, West, and South port connections between boards and crates are multiplexed (8+2):1 and are shorter than 1.5 m. Full implementation of a 3D-Flow system for the most complex trigger algorithm requires 320 cables to the north and south crates and 40 cables to the east and west crates (cable cost = $2 each). For applications requiring a simpler real-time algorithm (e.g., fewer than 20 steps, which is equivalent to 10 layers of 3D-Flow processors), the number of inter-board (North and South) and inter-crate (East and West) connections is likewise reduced to the number of layers used by the simpler algorithm, so not all cables need to be installed (e.g., an application requiring only nine layers of 3D-Flow processors saves 32 cables to the north, 32 to the south, four to the east, and four to the west crates). Details are also given on timing and synchronization issues, ASIC design verification, real-time performance monitoring, and (software and hardware) design development tools.
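
    The local-maxima step of the trigger algorithm can be illustrated in a few lines. The sketch below is purely illustrative (the real system runs this as microcode on the 3D-Flow processors, not in Python, and the tower energies shown are hypothetical): it flags trigger towers whose energy strictly exceeds every neighbor in a 3x3 window, mirroring the neighbor data exchange described above.

```python
def local_maxima_3x3(energies):
    """Return (row, col) of towers whose energy strictly exceeds all
    in-bounds neighbors in a 3x3 window -- an illustrative stand-in
    for one local-maximum trigger step."""
    rows, cols = len(energies), len(energies[0])
    peaks = []
    for r in range(rows):
        for c in range(cols):
            centre = energies[r][c]
            # Gather every in-bounds neighbor, clipped at the grid edges
            neighbors = [energies[rr][cc]
                         for rr in range(max(r - 1, 0), min(r + 2, rows))
                         for cc in range(max(c - 1, 0), min(c + 2, cols))
                         if (rr, cc) != (r, c)]
            if all(centre > n for n in neighbors):
                peaks.append((r, c))
    return peaks

# Toy 4x4 grid of trigger-tower energies (arbitrary units)
grid = [[1, 2, 1, 0],
        [2, 9, 2, 1],
        [1, 2, 1, 5],
        [0, 1, 2, 1]]
print(local_maxima_3x3(grid))  # [(1, 1), (2, 3)]
```

    In the hardware, each channel performs this comparison in parallel against data exchanged with its North, East, West, and South neighbors rather than scanning the grid sequentially.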

    Satellite interferometric data for landslide intensity evaluation in mountainous regions

    Multi-Temporal Interferometric Synthetic Aperture Radar (MTInSAR) data offer valuable support for landslide mapping and landslide activity estimation in mountain environments, where in situ measurements are sometimes difficult to gather. Nowadays, the interferometric approach is increasingly used for wide-area analyses, providing useful information for risk-management actors but at the same time requiring considerable effort to interpret correctly what the satellite data are telling us. In this context, hot-spot-like analyses, which select and highlight the fastest-moving areas in a region of interest, are a good operational solution for reducing the time needed to inspect a whole interferometric dataset composed of thousands or millions of points. In this work, we go beyond the concept of MTInSAR data as simple mapping tools by proposing an approach whose final goal is the quantification of the potential loss experienced by an element at risk hit by a potential landslide. To do so, it is necessary to evaluate landslide intensity. Here, we estimate intensity using Active Deformation Areas (ADA) extracted from Sentinel-1 MTInSAR data. Depending on the localization of each ADA with respect to the urban areas, intensity is derived in two different ways. Once the exposure and vulnerability of the elements at risk are estimated, the potential loss due to a landslide of a given intensity is calculated. We tested our methodology in the eastern Valle d'Aosta (north-western Italy), along four lateral valleys of the Dora Baltea Valley. This territory is characterized by steep slopes and by numerous active and dormant landslides. The goal of this work is to develop a regional-scale methodology based on satellite radar interferometry to assess the potential impact of landslides on the urban fabric.
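
    The hot-spot-like screening of an interferometric point dataset can be sketched as below. This is a deliberately simplified illustration: the velocity threshold, grid-cell size, and sample points are invented for the example, and real ADA extraction also applies noise and point-density criteria not shown here.

```python
from collections import defaultdict

def extract_hotspot_cells(points, vel_threshold=10.0, cell_size=0.01):
    """Keep only measurement points moving faster than a velocity
    threshold (mm/yr) and group them into grid cells as crude
    Active-Deformation-Area candidates. Illustrative sketch only."""
    cells = defaultdict(list)
    for lon, lat, vel in points:
        if abs(vel) >= vel_threshold:
            # Bucket fast-moving points by coarse grid cell
            key = (round(lon / cell_size), round(lat / cell_size))
            cells[key].append((lon, lat, vel))
    return dict(cells)

# Hypothetical MTInSAR points: (longitude, latitude, LOS velocity mm/yr)
pts = [(7.50, 45.70, -2.0), (7.51, 45.70, -15.0),
       (7.51, 45.701, -12.0), (7.80, 45.75, 11.0)]
hot = extract_hotspot_cells(pts)
print(len(hot))  # 2 candidate cells; the slow point is discarded
```

    Screening of this kind reduces a dataset of millions of points to a short list of candidate areas that an interpreter can then inspect manually.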

    Elicitation of Preferences under Ambiguity

    This paper is about behaviour under ambiguity ‒ that is, a situation in which probabilities either do not exist or are not known. Our objective is to find the most empirically valid of the increasingly large number of theories attempting to explain such behaviour. We use experimentally generated data to compare and contrast the theories. The incentivised experimental task we employed was that of allocation: in a series of problems we gave the subjects an amount of money and asked them to allocate it over three accounts, with the payoffs to them contingent on a 'state of the world' whose occurrence was ambiguous. We reproduced ambiguity in the laboratory using a Bingo Blower. We fitted the most popular and apparently empirically valid preference functionals [Subjective Expected Utility (SEU), MaxMin Expected Utility (MEU) and α-MEU], as well as Mean-Variance (MV) and a heuristic rule, Safety First (SF). We found that SEU fits better than MV and SF and only slightly worse than MEU and α-MEU.
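
    As a reading aid, the α-MEU functional mentioned above can be sketched as follows: an allocation is scored as a weight α on the worst-case expected utility over a set of priors plus (1 - α) on the best-case expected utility. The utility function, prior set, and allocation below are illustrative assumptions, not values from the experiment.

```python
def alpha_meu_value(payoffs, priors, alpha=0.5, utility=lambda x: x ** 0.5):
    """alpha-MEU evaluation of a state-contingent allocation:
    alpha * (worst-case EU over the prior set)
    + (1 - alpha) * (best-case EU over the prior set).
    alpha = 1 recovers MaxMin EU; a singleton prior set recovers SEU.
    The square-root utility here is an illustrative assumption."""
    eus = [sum(p * utility(x) for p, x in zip(prior, payoffs))
           for prior in priors]
    return alpha * min(eus) + (1 - alpha) * max(eus)

# Money allocated to three accounts, each paying in one ambiguous state
allocation = (16.0, 9.0, 25.0)
# Ambiguity modeled as a set of priors over the three states
prior_set = [(0.2, 0.3, 0.5), (0.4, 0.4, 0.2), (1/3, 1/3, 1/3)]
print(round(alpha_meu_value(allocation, prior_set, alpha=0.7), 3))  # 3.92
```

    With a singleton prior set the same function collapses to SEU, which is how the nested functionals can be compared on one scale.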

    Modeling double strand break susceptibility to interrogate structural variation in cancer

    Background: Structural variants (SVs) are known to play important roles in a variety of cancers, but their origins and functional consequences are still poorly understood. Many SVs are thought to emerge from errors in the repair processes following DNA double strand breaks (DSBs).

    Results: We used experimentally quantified DSB frequencies in cell lines with matched chromatin and sequence features to derive the first quantitative genome-wide models of DSB susceptibility. These models are accurate and provide novel insights into the mutational mechanisms generating DSBs. Models trained in one cell type can be successfully applied to others, but a substantial proportion of DSBs appear to reflect cell type-specific processes. Using model predictions as a proxy for susceptibility to DSBs in tumors, many SV-enriched regions appear to be poorly explained by selectively neutral mutational bias alone. A substantial number of these regions show unexpectedly high SV breakpoint frequencies given their predicted susceptibility to mutation and are therefore credible targets of positive selection in tumors. These putatively positively selected SV hotspots are enriched for genes previously shown to be oncogenic. In contrast, several hundred regions across the genome show unexpectedly low levels of SVs, given their relatively high susceptibility to mutation. These novel coldspot regions appear to be subject to purifying selection in tumors and are enriched for active promoters and enhancers.

    Conclusions: We conclude that models of DSB susceptibility offer a rigorous approach to the inference of SVs putatively subject to selection in tumors.
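
    The hotspot/coldspot logic can be pictured with a toy residual analysis: regress observed SV breakpoint counts on predicted DSB susceptibility and flag regions whose counts sit far above or below the trend. This is a minimal sketch of the reasoning, not the paper's actual statistical model; the region data and the z-score cutoff are invented for illustration.

```python
def flag_selection_candidates(susceptibility, sv_counts, z_cut=2.0):
    """Fit a simple least-squares line of SV counts on predicted DSB
    susceptibility; regions with residuals far above the line are
    candidate positive-selection hotspots, far below are candidate
    purifying-selection coldspots. Illustrative sketch only."""
    n = len(susceptibility)
    mx = sum(susceptibility) / n
    my = sum(sv_counts) / n
    sxx = sum((x - mx) ** 2 for x in susceptibility)
    sxy = sum((x - mx) * (y - my)
              for x, y in zip(susceptibility, sv_counts))
    slope = sxy / sxx
    intercept = my - slope * mx
    residuals = [y - (intercept + slope * x)
                 for x, y in zip(susceptibility, sv_counts)]
    sd = (sum(r ** 2 for r in residuals) / n) ** 0.5
    hot = [i for i, r in enumerate(residuals) if r > z_cut * sd]
    cold = [i for i, r in enumerate(residuals) if r < -z_cut * sd]
    return hot, cold

# Hypothetical regions: predicted susceptibility vs observed SV counts;
# region 4 carries far more breakpoints than its susceptibility predicts
susc = [1, 2, 3, 4, 5, 6]
svs = [2, 4, 6, 8, 25, 12]
print(flag_selection_candidates(susc, svs, z_cut=1.5))  # ([4], [])
```

    A region flagged as "hot" here is one whose excess of SVs cannot be attributed to mutational susceptibility alone, which is the sense in which such regions become credible targets of selection.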

    Evaluating the spatial uncertainty of future land abandonment in a mountain valley (Vicdessos, Pyrenees-France): insights from model parameterization and experiments

    European mountains are particularly sensitive to climatic disruptions and land use changes. The latter have led to high rates of natural reforestation over the last 50 years. Faced with the challenge of predicting possible impacts on ecosystem services, LUCC models offer new opportunities for land managers to adapt or mitigate their strategies. Assessing the spatial uncertainty of future LUCC is crucial for the definition of sustainable land use strategies. However, the sources of uncertainty may differ, including the input parameters, the model itself, and the wide range of possible futures. The aim of this paper is to propose a method to assess the probability of occurrence of future LUCC that combines the inherent uncertainty of model parameterization and the ensemble uncertainty of future-based scenarios. For this purpose, we used the Land Change Modeler tool to simulate future LUCC on a study site located in the Pyrenees Mountains (France) with two scenarios illustrating two land use strategies. The model was parameterized with the same driving factors used for its calibration. The definition of static vs. dynamic and quantitative vs. qualitative (discretized) driving factors, and their combinations, resulted in four parameterizations. The combination of model outcomes produced maps of the spatial uncertainty of future LUCC. This work contributes to the literature on future-based LUCC studies. It goes beyond the uncertainty of simulation models by integrating the uncertainty of the future to provide maps that help decision makers and land managers.
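
    The ensemble-combination step, turning several simulated land-change maps into one probability-of-occurrence map, can be sketched as below. The per-cell values and the eight runs (four parameterizations times two scenarios) are hypothetical; the sketch only shows the principle of counting, for each cell, the fraction of runs in which change is simulated.

```python
def occurrence_probability(simulated_maps):
    """Combine binary land-change maps from several model runs into a
    per-cell probability of occurrence: the fraction of runs in which
    each cell is simulated as changing. Illustrative sketch only."""
    n_runs = len(simulated_maps)
    rows, cols = len(simulated_maps[0]), len(simulated_maps[0][0])
    return [[sum(m[r][c] for m in simulated_maps) / n_runs
             for c in range(cols)]
            for r in range(rows)]

# Eight hypothetical runs on a toy 2x2 map
# (1 = cell predicted to change, e.g. reforest; 0 = no change)
runs = [[[1, 0], [1, 1]], [[1, 0], [0, 1]],
        [[1, 1], [0, 1]], [[1, 0], [1, 1]],
        [[0, 0], [1, 1]], [[1, 0], [0, 1]],
        [[1, 1], [1, 1]], [[1, 0], [0, 1]]]
print(occurrence_probability(runs))  # [[0.875, 0.25], [0.5, 1.0]]
```

    Cells on which all parameterizations and scenarios agree (probability near 0 or 1) are spatially certain; intermediate values map exactly the spatial uncertainty the paper sets out to communicate to land managers.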